2.
J Vis ; 24(4): 3, 2024 Apr 01.
Article En | MEDLINE | ID: mdl-38558158

The sudden onset of a visual object or event elicits an inhibition of eye movements at latencies approaching the minimum delay of visuomotor conductance in the brain. Typically, information presented via multiple sensory modalities, such as sound and vision, evokes stronger and more robust responses than unisensory information. Whether and how multisensory information affects ultra-short latency oculomotor inhibition is unknown. In two experiments, we investigate smooth pursuit and saccadic inhibition in response to multisensory distractors. Observers tracked a horizontally moving dot and were interrupted by an unpredictable visual, auditory, or audiovisual distractor. Distractors elicited a transient inhibition of pursuit eye velocity and catch-up saccade rate within ∼100 ms of their onset. Audiovisual distractors evoked stronger oculomotor inhibition than visual- or auditory-only distractors, indicating multisensory response enhancement. Multisensory response enhancement magnitudes were equal to the linear sum of responses to component stimuli. These results demonstrate that multisensory information affects eye movements even at ultra-short latencies, establishing a lower time boundary for multisensory-guided behavior. We conclude that oculomotor circuits must have privileged access to sensory information from multiple modalities, presumably via a fast, subcortical pathway.
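
As a rough illustration of the additivity comparison described above, the sketch below contrasts a hypothetical multisensory inhibition magnitude with the linear sum of the unisensory responses. All values and variable names are illustrative stand-ins, not the study's data.

```python
# Hypothetical inhibition magnitudes (e.g., drop in pursuit velocity in deg/s);
# illustrative values only, not the study's measurements.
resp_visual = 1.2       # visual-only distractor
resp_auditory = 0.8     # auditory-only distractor
resp_audiovisual = 2.0  # audiovisual distractor

# Enhancement relative to the strongest unisensory response
best_unisensory = max(resp_visual, resp_auditory)
enhancement = (resp_audiovisual - best_unisensory) / best_unisensory

# Additivity test: a linear combination predicts that the multisensory
# response equals the sum of the unisensory responses.
linear_sum = resp_visual + resp_auditory

print(f"Multisensory enhancement: {enhancement:.0%}")
if abs(resp_audiovisual - linear_sum) < 1e-9:
    print("Response matches the linear sum (additive integration).")
elif resp_audiovisual > linear_sum:
    print("Response exceeds the linear sum (super-additive).")
else:
    print("Response falls short of the linear sum (sub-additive).")
```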


Brain; Pursuit, Smooth; Humans; Reaction Time/physiology; Brain/physiology; Saccades; Memory; Photic Stimulation/methods
3.
bioRxiv ; 2023 Oct 02.
Article En | MEDLINE | ID: mdl-37873151

How features of complex visual patterns combine to drive perception and eye movements is not well understood. We simultaneously assessed human observers' perceptual direction estimates and ocular following responses (OFR) evoked by moving plaids made from two summed gratings with varying contrast ratios. When the gratings were of equal contrast, observers' eye movements and perceptual reports followed the motion of the plaid pattern. However, when the contrasts were unequal, eye movements and reports during early phases of the OFR were biased toward the direction of the high-contrast grating component; during later phases, both responses more closely followed the plaid pattern direction. The shift from component- to pattern-driven behavior resembles the shift in tuning seen under similar conditions in neuronal responses recorded from monkey MT. Moreover, for some conditions, pattern tracking and perceptual reports were correlated on a trial-by-trial basis. The OFR may therefore provide a precise behavioural read-out of the dynamics of neural motion integration for complex visual patterns.
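
The component- versus pattern-motion distinction can be made concrete with two standard textbook computations: a contrast-weighted vector average of the component motions (a stand-in for early, component-biased responses) and the intersection-of-constraints (IOC) solution for the true plaid direction. The sketch below is a generic illustration, not the authors' model, and all numbers are assumed.

```python
import numpy as np

def ioc_pattern_velocity(dir1_deg, speed1, dir2_deg, speed2):
    """Intersection of constraints: find the 2D pattern velocity whose
    projection onto each grating's motion direction equals that grating's speed."""
    n1 = np.array([np.cos(np.radians(dir1_deg)), np.sin(np.radians(dir1_deg))])
    n2 = np.array([np.cos(np.radians(dir2_deg)), np.sin(np.radians(dir2_deg))])
    return np.linalg.solve(np.vstack([n1, n2]), np.array([speed1, speed2]))

def vector_average(dir1_deg, c1, dir2_deg, c2):
    """Contrast-weighted average of the component motion vectors --
    a simple stand-in for early, component-dominated responses."""
    v1 = c1 * np.array([np.cos(np.radians(dir1_deg)), np.sin(np.radians(dir1_deg))])
    v2 = c2 * np.array([np.cos(np.radians(dir2_deg)), np.sin(np.radians(dir2_deg))])
    return (v1 + v2) / (c1 + c2)

# Two gratings drifting 45 deg apart; unequal contrasts bias the early estimate.
ioc = ioc_pattern_velocity(-22.5, 1.0, 22.5, 1.0)
va = vector_average(-22.5, 0.8, 22.5, 0.2)   # high- vs low-contrast component
print("IOC (pattern) direction:", np.degrees(np.arctan2(ioc[1], ioc[0])))
print("Contrast-weighted direction:", np.degrees(np.arctan2(va[1], va[0])))
```

With equal contrasts the two estimates coincide; with unequal contrasts the weighted average is pulled toward the high-contrast component, mirroring the early-versus-late shift described above.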

4.
eNeuro ; 10(8)2023 08.
Article En | MEDLINE | ID: mdl-37591732

Natural movements, such as catching a ball or capturing prey, typically involve multiple senses. Yet, laboratory studies on human movements commonly focus solely on vision and ignore sound. Here, we ask how visual and auditory signals are integrated to guide interceptive movements. Human observers tracked the brief launch of a simulated baseball, randomly paired with batting sounds of varying intensities, and made a quick pointing movement at the ball. Movement end points revealed systematic overestimation of target speed when the ball launch was paired with a loud versus a quiet sound, although sound was never informative. This effect was modulated by the availability of visual information; sounds biased interception when the visual presentation duration of the ball was short. Amplitude of the first catch-up saccade, occurring ∼125 ms after target launch, revealed early integration of audiovisual information for trajectory estimation. This sound-induced bias was reversed during later predictive saccades when more visual information was available. Our findings suggest that auditory and visual signals are integrated to guide interception and that this integration process must occur early at a neural site that receives auditory and visual signals within an ultrashort time span.
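
One way to think about why the sound-induced bias shrinks with longer viewing is a reliability-weighted combination of the visual speed estimate with a sound-driven bias term. The sketch below is a speculative illustration under that assumption, not the authors' model; every parameter name and value is hypothetical.

```python
# Reliability-weighted sketch (assumed model, illustrative values only):
# a non-informative loud sound biases perceived speed more strongly when
# the visual evidence is brief and therefore less reliable.
def estimate_speed(visual_speed, visual_reliability, sound_bias):
    """Weighted combination of the visual estimate and a sound-shifted estimate."""
    w_visual = visual_reliability
    return w_visual * visual_speed + (1.0 - w_visual) * (visual_speed + sound_bias)

true_speed = 30.0                 # deg/s, hypothetical launch speed
loud_bias, quiet_bias = 5.0, 0.0  # loud sounds push the estimate upward

for duration, reliability in [("short", 0.6), ("long", 0.95)]:
    loud = estimate_speed(true_speed, reliability, loud_bias)
    quiet = estimate_speed(true_speed, reliability, quiet_bias)
    print(f"{duration} presentation: loud = {loud:.1f}, quiet = {quiet:.1f} deg/s")
```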


Movement; Saccades; Humans; Sensation; Sound
5.
eNeuro ; 9(5)2022.
Article En | MEDLINE | ID: mdl-36635938

Objects in our visual environment often move unpredictably and can suddenly speed up or slow down. The ability to account for acceleration when interacting with moving objects can be critical for survival. Here, we investigate how human observers track an accelerating target with their eyes and predict its time of reappearance after a temporal occlusion by making an interceptive hand movement. Before occlusion, observers smoothly tracked the accelerating target with their eyes. At the time of occlusion, observers made a predictive saccade to the location where they subsequently intercepted the target with a quick pointing movement. We tested how observers integrated target motion information by comparing three alternative models that describe time-to-contact (TTC) based on (1) the final target velocity sample before occlusion, (2) the average target velocity before occlusion, or (3) the final target velocity and the rate of target acceleration. We show that observers were able to accurately track the accelerating target with visually-guided smooth pursuit eye movements. However, the timing of the predictive saccade and manual interception revealed an inability to act on target acceleration when predicting TTC. Instead, interception timing was best described by the final velocity model that relies on extrapolating the last available target velocity sample before occlusion. Moreover, predictive saccades and manual interception showed similar insensitivity to target acceleration and were correlated on a trial-by-trial basis. These findings provide compelling evidence for a failure to integrate target acceleration into the predictive models of target motion that drive both interceptive eye and hand movements.
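
The three TTC models compared in the study can be written down directly. The sketch below applies them to a hypothetical, constantly accelerating target (all values are illustrative) to show how extrapolating only the final velocity mis-times an accelerating target.

```python
# Sketch of the three time-to-contact (TTC) models; values are illustrative.
def ttc_final_velocity(distance, v_final):
    # Model 1: extrapolate the last velocity sample before occlusion.
    return distance / v_final

def ttc_average_velocity(distance, v_samples):
    # Model 2: extrapolate the average pre-occlusion velocity.
    return distance / (sum(v_samples) / len(v_samples))

def ttc_with_acceleration(distance, v_final, accel):
    # Model 3: solve d = v*t + 0.5*a*t^2 for t (constant acceleration).
    if accel == 0:
        return distance / v_final
    return (-v_final + (v_final**2 + 2 * accel * distance) ** 0.5) / accel

distance = 10.0                     # deg of visual angle to the interception point
accel = 4.0                         # deg/s^2
v_samples = [8.0, 9.0, 10.0, 11.0]  # pre-occlusion velocity samples (deg/s)
v_final = v_samples[-1]

print("final-velocity TTC:  ", ttc_final_velocity(distance, v_final))
print("average-velocity TTC:", ttc_average_velocity(distance, v_samples))
print("acceleration TTC:    ", ttc_with_acceleration(distance, v_final, accel))
```

For an accelerating target the acceleration-aware TTC is shortest; the final-velocity model (which best described the observers' timing) predicts a later arrival, and the average-velocity model later still.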


Motion Perception; Humans; Saccades; Pursuit, Smooth; Movement; Motion; Photic Stimulation
6.
Psychon Bull Rev ; 28(6): 1982-1990, 2021 Dec.
Article En | MEDLINE | ID: mdl-34159531

Visual working memory (VWM) is typically found to be severely limited in capacity, but this limitation may be ameliorated by providing familiar objects that are associated with knowledge stored in long-term memory. However, comparing meaningful and meaningless stimuli usually entails a confound, because different types of objects also tend to vary in terms of their inherent perceptual complexity. The current study therefore aimed to dissociate stimulus complexity from object meaning in VWM. To this end, identical stimuli - namely, simple color-shape conjunctions - were presented, which either resembled meaningful configurations ("real" European flags) or were rearranged into perceptually identical but meaningless ("fake") flags. The results showed that complexity estimates for "real" and "fake" flags were higher than for unicolor baseline stimuli. However, VWM capacity for real flags was comparable to that for the unicolor baseline stimuli (and substantially higher than that for fake flags). This shows that relatively complex yet meaningful "real" flags yield a VWM capacity comparable to that for rather simple, unicolor memory items. Moreover, this "nationality" benefit was related to individual flag recognition performance, showing that VWM depends on object knowledge.
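
Change-detection capacity is often summarized with Cowan's K = set_size * (hit_rate - false_alarm_rate). The abstract does not state which estimator was used, so the sketch below is only a generic illustration of that common formula, with made-up rates.

```python
# Cowan's K capacity estimate from change-detection performance.
# Hit and false-alarm rates below are invented for illustration.
def cowans_k(set_size, hit_rate, false_alarm_rate):
    return set_size * (hit_rate - false_alarm_rate)

set_size = 6
print("real flags:", cowans_k(set_size, hit_rate=0.80, false_alarm_rate=0.15))
print("fake flags:", cowans_k(set_size, hit_rate=0.60, false_alarm_rate=0.20))
```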


Ethnicity; Memory, Short-Term; Humans; Knowledge; Memory, Long-Term; Recognition, Psychology; Visual Perception
7.
Vision Res ; 183: 81-90, 2021 06.
Article En | MEDLINE | ID: mdl-33743442

When we catch a moving object in mid-flight, our eyes and hands are directed toward the object. Yet, the functional role of eye movements in guiding interceptive hand movements is not yet well understood. This review synthesizes emergent views on the importance of eye movements during manual interception with an emphasis on laboratory studies published since 2015. We discuss the role of eye movements in forming visual predictions about a moving object, and for enhancing the accuracy of interceptive hand movements through feedforward (extraretinal) and feedback (retinal) signals. We conclude by proposing a framework that defines the role of human eye movements for manual interception accuracy as a function of visual certainty and object motion predictability.


Eye Movements; Motion Perception; Hand; Humans; Movement; Psychomotor Performance; Pursuit, Smooth; Retina; Saccades
8.
Cortex ; 133: 133-148, 2020 12.
Article En | MEDLINE | ID: mdl-33120191

Attention shifts that precede goal-directed eye and hand movements are regarded as markers of motor target selection. Whether effectors compete for a single, shared attentional resource during simultaneous eye-hand movements, or whether attentional resources can be allocated independently towards multiple target locations, remains a matter of debate. Independent, effector-specific target selection mechanisms underlying parallel allocation of visuospatial attention to saccade and reach targets would predict an increase of the overall attention capacity with the number of active effectors. We test this hypothesis in a modified Theory of Visual Attention (TVA; Bundesen, 1990) paradigm. Participants reported briefly presented letters during eye, hand, or combined eye-hand movement preparation to centrally cued locations. Modeling the data according to TVA allowed us to assess both the overall attention capacity and the deployment of visual attention to individual locations in the visual workspace. In two experiments, we show that attention is predominantly allocated to the motor targets, without pronounced competition between effectors. The parallel benefits at eye and hand targets, however, come with concomitant costs at non-motor locations, and the overall attention capacity does not increase with the simultaneous recruitment of both effector systems. Moreover, premotor shifts of attention dominate over voluntary deployment of processing resources, yielding severe impairments of voluntary attention allocation. We conclude that attention shifts to multiple effector targets without mutual competition, provided that sufficient processing resources can be withdrawn from movement-irrelevant locations.
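
In TVA (Bundesen, 1990), each item x is processed at rate v_x = C * w_x / sum(w), and the probability of reporting it after exposure t is 1 - exp(-v_x * (t - t0)); the overall capacity C and the attentional weights w are the quantities estimated from letter-report data. The sketch below illustrates these equations with made-up parameter values; it is not the study's fitted model.

```python
import math

# Minimal TVA sketch: processing rates and report probabilities for a few
# locations. C, t0, and the weights below are illustrative, not fitted values.
def report_probability(C, weights, exposure, t0, item):
    v = C * weights[item] / sum(weights.values())   # rate for this item (items/s)
    return 1.0 - math.exp(-v * max(exposure - t0, 0.0))

C = 40.0                                  # overall capacity (items/s)
t0 = 0.02                                 # perceptual threshold (s)
weights = {"eye_target": 1.0, "hand_target": 1.0, "neutral": 0.3}

for item in weights:
    p = report_probability(C, weights, exposure=0.1, t0=t0, item=item)
    print(f"{item}: P(report) = {p:.2f}")
```

Because the weights are normalized by their sum, boosting the motor targets necessarily lowers report probability at neutral locations unless C itself grows, which is exactly the trade-off the study tested.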


Eye Movements; Hand; Cues; Humans; Movement; Photic Stimulation; Reaction Time; Saccades
9.
eNeuro ; 6(5)2019.
Article En | MEDLINE | ID: mdl-31488551

In situations requiring immediate action, humans can generate visually-guided responses at remarkably short latencies. Here, to better understand the visual attributes that best evoke such rapid responses, we recorded upper limb muscle activity while participants performed visually-guided reaches towards Gabor patches composed of differing spatial frequencies (SFs). We studied reaches initiated from a stable posture (experiment 1, a static condition), or during on-line reach corrections to an abruptly displaced target (experiment 2, a dynamic condition). In both experiments, we detail the latency and prevalence of stimulus-locked responses (SLRs), which are brief bursts of EMG activity that are time-locked to target presentation rather than movement onset. SLRs represent the first wave of EMG recruitment influenced by target presentation, and enable quantification of rapid visuomotor transformations. In both experiments, reach targets composed of low SFs elicited the shortest latency and most prevalent SLRs, with SLR latency increasing and SLR prevalence decreasing for reach targets composed of progressively higher SFs. SLRs could be evoked in either the static or dynamic condition, and when present in experiment 2, were associated with shorter latency and larger magnitude corrections. The results in experiment 2 are consistent with a linkage between the forces produced by SLRs and the earliest portion of on-line reach corrections. Overall, our results demonstrate that stimuli composed of low SFs preferentially evoke the most rapid visuomotor responses that, in the context of rapidly correcting an on-going reaching movement, are associated with earlier and larger on-line reach corrections.
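
A simple way to quantify an SLR is to test whether stimulus-aligned EMG exceeds a baseline-derived threshold within the ~80-120 ms window after target onset. The sketch below implements that heuristic on synthetic data; it is a simplified illustration and not necessarily the detection procedure used in the study (time-series ROC analyses are also common).

```python
import numpy as np

def detect_slr(emg_trials, t_ms, baseline=(-100, 0), window=(80, 120), n_sd=3.0):
    """Return the first time (ms) at which trial-averaged, stimulus-aligned EMG
    exceeds baseline mean + n_sd * SD inside the SLR window, or None."""
    mean_emg = emg_trials.mean(axis=0)
    base = mean_emg[(t_ms >= baseline[0]) & (t_ms < baseline[1])]
    threshold = base.mean() + n_sd * base.std()
    in_window = (t_ms >= window[0]) & (t_ms <= window[1])
    above = np.where(in_window & (mean_emg > threshold))[0]
    return t_ms[above[0]] if above.size else None

# Synthetic example: baseline noise plus a small burst ~90 ms after target onset.
rng = np.random.default_rng(0)
t_ms = np.arange(-100, 201)
emg = rng.normal(1.0, 0.1, size=(50, t_ms.size))      # 50 rectified-EMG trials
emg[:, (t_ms >= 90) & (t_ms < 110)] += 0.8             # injected stimulus-locked burst
print("SLR latency (ms):", detect_slr(emg, t_ms))
```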


Arm/physiology; Movement/physiology; Photic Stimulation/methods; Psychomotor Performance/physiology; Reaction Time/physiology; Adult; Electromyography/methods; Female; Humans; Male; Random Allocation; Young Adult
10.
J Neurophysiol ; 118(1): 404-415, 2017 07 01.
Article En | MEDLINE | ID: mdl-28515287

In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements. Here we ask whether eye and hand movements integrate motion signals from target and context similarly or differently, and whether context effects on eye and hand change over time. We developed a track-intercept task requiring participants to track the initial launch of a moving object ("ball") with smooth pursuit eye movements. The ball disappeared after a brief presentation, and participants had to intercept it in a designated "hit zone." In two experiments (n = 18 human observers each), the ball was shown in front of a uniform or a textured background that either was stationary or moved along with the target. Eye and hand movement latencies and speeds were similarly affected by the visual context, but eye and hand interception (eye position at time of interception, and hand interception timing error) did not differ significantly between context conditions. Eye and hand interception timing errors were strongly correlated on a trial-by-trial basis across all context conditions, highlighting the close relation between these responses in manual interception tasks. Our results indicate that visual contexts similarly affect eye and hand movements but that these effects may be short-lasting, affecting movement trajectories more than movement end points. NEW & NOTEWORTHY: In a novel track-intercept paradigm, human observers tracked a briefly shown object moving across a textured, dynamic context and intercepted it with their finger after it had disappeared. Context motion significantly affected eye and hand movement latency and speed, but not interception accuracy; eye and hand position at interception were correlated on a trial-by-trial basis. Visual context effects may be short-lasting, affecting movement trajectories more than movement end points.
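
The trial-by-trial coupling between eye and hand interception errors reported above amounts to a correlation computed across trials. The sketch below illustrates that computation on synthetic stand-in data with a shared timing component; it is not the study's data or analysis pipeline.

```python
import numpy as np

# Synthetic stand-in data: a shared timing component plus independent noise.
rng = np.random.default_rng(1)
n_trials = 100
shared = rng.normal(0, 20, n_trials)               # shared timing variability (ms)
eye_error = shared + rng.normal(0, 10, n_trials)   # proxy for eye-based timing error
hand_error = shared + rng.normal(0, 10, n_trials)  # proxy for hand interception timing error

r = np.corrcoef(eye_error, hand_error)[0, 1]
print(f"trial-by-trial correlation r = {r:.2f}")
```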


Hand; Motion Perception; Psychomotor Performance; Pursuit, Smooth; Adult; Eye Movement Measurements; Female; Humans; Male; Photic Stimulation; Psychophysics
...